# RedPajama dataset

## LLäMmlein 7B

License: Other
LLäMmlein 7B is a German LLaMA language model with 7 billion parameters, trained from scratch on the German portion of the RedPajama V2 dataset using an adapted TinyLlama codebase.
Tags: Large Language Model · Transformers · German
Organization: LSX-UniWue · 251 · 2
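
A minimal sketch of loading this model with the Hugging Face transformers library follows; the repository id under the LSX-UniWue organization is an assumption and should be verified on the hub before use.

```python
# Hedged sketch: loading LLäMmlein 7B via transformers.
# The repo id "LSX-UniWue/LLaMmlein_7B" is assumed from the organization name above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "LSX-UniWue/LLaMmlein_7B"  # assumed repository id, verify on the hub

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,   # half precision keeps the 7B weights around 14 GB
    device_map="auto",            # requires the accelerate package
)

# German prompt, since the model was trained on the German part of RedPajama V2
prompt = "Die Hauptstadt von Deutschland ist"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```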
## OpenLLaMA 13B

License: Apache-2.0
OpenLLaMA is an open-source reproduction of Meta AI's LLaMA large language model, offering pretrained models at the 3B, 7B, and 13B parameter scales.
Tags: Large Language Model · Transformers
Organization: openlm-research · 1,300 · 455
## OpenLLaMA 3B

License: Apache-2.0
OpenLLaMA is an open-source reproduction of Meta AI's LLaMA large language model, offering pretrained models at the 3B, 7B, and 13B parameter scales.
Tags: Large Language Model · Transformers
Organization: openlm-research · 26.20k · 157
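
Since all three OpenLLaMA checkpoints are published under the openlm-research organization, a minimal loading sketch looks like the following; the exact repository id ("openlm-research/open_llama_3b") is inferred from the organization and model names above, and the 7B and 13B variants swap in the same way.

```python
# Hedged sketch: loading OpenLLaMA 3B via transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "openlm-research/open_llama_3b"  # inferred repo id; 7B/13B follow the same pattern

# The OpenLLaMA release notes advise against the fast tokenizer,
# so the slow SentencePiece tokenizer is used here as a cautious default.
tokenizer = AutoTokenizer.from_pretrained(repo_id, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,
    device_map="auto",  # requires the accelerate package
)

prompt = "Q: What is the largest animal on Earth?\nA:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```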